Adequate input for learning in attractor neural networks

Author

  • Daniel J. Amit
Abstract

In the context of learning in attractor neural networks (ANN), we discuss the constraints imposed by the requirement that the afferents arriving at the neurons in the attractor network from the stimulus compete successfully with the afferents generated by the recurrent activity inside the network, in a situation in which both sets of synaptic efficacies are weak and approximately equal. We simulate and analyze a two-component network: one component representing the stimulus, the other an ANN. We show that if stimuli are correlated with the receptive fields of neurons in the ANN, and are of sufficient contrast, the stimulus can provide the necessary information to the recurrent network to allow learning of new stimuli, even in the very disfavored situation of synaptic predominance in the recurrent part. Stimuli which are insufficiently correlated with the receptive fields, or are of insufficient contrast, are submerged by the recurrent activity.
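The competition described above can be illustrated with a minimal rate-model sketch. This is a hypothetical toy, not the paper's actual simulation: all parameter values (network size, stimulus contrasts, number of stored patterns) are assumptions chosen for illustration. The network starts in a stored attractor; a novel stimulus of high contrast pulls the state toward itself, while a low-contrast one is submerged by the recurrent afferents.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                            # neurons in the attractor network
patterns = rng.choice([-1.0, 1.0], size=(3, N))    # previously stored patterns

# Weak Hebbian recurrent couplings over the stored patterns
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def present(stimulus, steps=20):
    """Start from a stored attractor; let recurrent and external
    afferents compete for control of the network state."""
    s = patterns[0].copy()                 # network sits in a stored attractor
    for _ in range(steps):
        s = np.sign(J @ s + stimulus)      # total afferent input decides each sign
    return s

new_pattern = rng.choice([-1.0, 1.0], size=N)      # a novel stimulus

strong = present(2.0 * new_pattern)   # high contrast: stimulus imposes itself
weak = present(0.2 * new_pattern)     # low contrast: submerged by recurrence

print(strong @ new_pattern / N)   # overlap with the new stimulus: close to 1
print(weak @ patterns[0] / N)     # overlap with the old attractor: close to 1
```

With the stored-pattern field of order 1 per neuron, a contrast of 2.0 overrides it everywhere, while a contrast of 0.2 cannot flip a single sign, so the network never leaves the old attractor.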


Related works

Attractor Memory with Self-organizing Input

We propose a neural network based autoassociative memory system for unsupervised learning. This system is intended to be an example of how a general information processing architecture, similar to that of neocortex, could be organized. The neural network has its units arranged into two separate groups called populations, one input and one hidden population. The units in the input population for...


Theory of Input Driven Dynamical Systems

Most dynamic models of interest in machine learning, robotics, AI or cognitive science are nonautonomous and input-driven. In the last few years a number of important innovations have occurred in mathematical research on nonautonomous systems. In understanding the long-term behavior of nonautonomous systems, the notion of an attractor is fundamental. With a time-varying input, it turns out that f...


Recurrent Learning of Input-output Stable Behaviour in Function Space: a Case Study with the Roessler Attractor

We analyse the stability of the input-output behaviour of a recurrent network. It is trained to implement an operator implicitly given by the chaotic dynamics of the Roessler attractor. Two of the attractor's coordinate functions are used as network input and the third defines the reference output. Using recently developed new methods we show that the trained network is input-output stable and c...


A Signature of Attractor Dynamics in the CA3 Region of the Hippocampus

The notion of attractor networks is the leading hypothesis for how associative memories are stored and recalled. A defining anatomical feature of such networks is excitatory recurrent connections. These "attract" the firing pattern of the network to a stored pattern, even when the external input is incomplete (pattern completion). The CA3 region of the hippocampus has been postulated to be such...


Prototype Extraction in Material Attractor Neural Network with Stochastic Dynamic Learning

Dynamic learning of random stimuli can be described as a random walk among the stable synaptic values. It is shown that prototype extraction can take place in material attractor neural networks when the stimuli are correlated and hierarchically organized. The network learns a set of attractors representing the prototypes in a completely unsupervised fashion and is able to modify its attractors ...
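The random walk among stable synaptic values can be sketched with binary (two-state) synapses: each presentation of a noisy stimulus stochastically potentiates or depresses synapses toward the Hebbian sign of that stimulus. This is a hedged illustration of the general mechanism, not the cited paper's model; the transition probability, noise level, and network size are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100
q = 0.05                                             # per-presentation transition probability
J = rng.integers(0, 2, size=(N, N)).astype(float)    # binary synapses, values in {0, 1}

prototype = rng.choice([-1.0, 1.0], size=N)          # the underlying prototype

for _ in range(2000):
    # Present a noisy version of the prototype (10% of units flipped)
    noisy = prototype * rng.choice([1.0, -1.0], size=N, p=[0.9, 0.1])
    hebb = np.outer(noisy, noisy)                    # +1: potentiate, -1: depress
    flip = rng.random((N, N)) < q                    # which synapses make a transition
    J[flip & (hebb > 0)] = 1.0                       # stochastic potentiation
    J[flip & (hebb < 0)] = 0.0                       # stochastic depression

# Fraction of synapses whose state matches the prototype's outer product,
# i.e. how well the unsupervised random walk has extracted the prototype
agreement = np.mean((2 * J - 1) == np.sign(np.outer(prototype, prototype)))
print(agreement)    # well above the chance level of 0.5
```

Because each presentation agrees with the prototype's Hebbian sign more often than it disagrees, the stationary distribution of each synapse is biased toward the prototype, which is the sense in which the prototype is extracted without supervision.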



Journal:

Volume   Issue

Pages  -

Publication year: 1993